Multilinear principal component analysis

Multilinear principal component analysis (MPCA)〔M. A. O. Vasilescu, D. Terzopoulos (2002), "Multilinear Analysis of Image Ensembles: TensorFaces", Proc. 7th European Conference on Computer Vision (ECCV'02), Copenhagen, Denmark, May 2002.〕〔M. A. O. Vasilescu, D. Terzopoulos (2003), "Multilinear Subspace Analysis of Image Ensembles", Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR'03), Madison, WI, June 2003.〕〔M. A. O. Vasilescu (2002), "Human Motion Signatures: Analysis, Synthesis, Recognition", Proc. International Conference on Pattern Recognition (ICPR'02), Quebec City, Canada, August 2002.〕〔H. Lu, K. N. Plataniotis, A. N. Venetsanopoulos (2008), "MPCA: Multilinear principal component analysis of tensor objects", ''IEEE Trans. Neural Netw.'', 19 (1), 18–39.〕 is a mathematical procedure that uses multiple orthogonal transformations to convert a set of multidimensional objects into another set of multidimensional objects of lower dimension. There is one orthogonal (linear) transformation for each dimension (mode); hence ''multilinear''. The transformation aims to capture as much of the variability in the data as possible, subject to the constraint of mode-wise orthogonality.
MPCA is a multilinear extension of principal component analysis (PCA). The major difference is that PCA must reshape each multidimensional object into a vector, while MPCA operates directly on multidimensional objects through mode-wise processing. For example, for 100×100 images, PCA operates on vectors of size 10000×1, while MPCA operates on vectors of size 100×1 in each of two modes. For the same amount of dimension reduction, PCA needs to estimate 10000/(100×2) − 1 = 49 times more parameters than MPCA. Thus, MPCA is more efficient and better conditioned in practice.
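The parameter count above is simple arithmetic: PCA learns projections over the vectorized length (10000), while MPCA learns one short projection per mode (100 each for two modes). A quick check of the figure quoted in the text:

```python
# Parameter-count comparison from the text: 100x100 images.
# PCA vectorizes each image; MPCA keeps the two modes separate.
pca_dim = 100 * 100      # length of the vectorized image seen by PCA
mpca_dim = 100 * 2       # 100 entries per mode, two modes, seen by MPCA

times_more = pca_dim / mpca_dim - 1
print(times_more)        # 49.0
```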
MPCA is a basic algorithm for dimension reduction via multilinear subspace learning. In a wider scope, it belongs to tensor-based computation. Its origin can be traced back to the Tucker decomposition in the 1960s, and it is closely related to the higher-order singular value decomposition (HOSVD)〔L. D. Lathauwer, B. D. Moor, J. Vandewalle (2000), "A multilinear singular value decomposition", ''SIAM Journal of Matrix Analysis and Applications'', 21 (4), 1253–1278.〕 and to the best rank-(R1, R2, ..., RN) approximation of higher-order tensors.〔L. D. Lathauwer, B. D. Moor, J. Vandewalle (2000), "On the best rank-1 and rank-(R1, R2, ..., RN) approximation of higher-order tensors", ''SIAM Journal of Matrix Analysis and Applications'', 21 (4), 1324–1342.〕
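To make the connection to HOSVD concrete, here is a minimal NumPy sketch of a truncated HOSVD: one orthogonal factor per mode from a mode-wise SVD, then a small core tensor. The function names (`unfold`, `hosvd`) are illustrative, not from any particular library.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor, ranks):
    """Truncated HOSVD sketch: per-mode orthogonal factors plus a core."""
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of the mode-n unfolding
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Project onto each mode's leading subspace to obtain the core tensor
    core = tensor.copy()
    for mode, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5, 6))
core, factors = hosvd(X, (2, 3, 3))
print(core.shape)    # (2, 3, 3)
```

The core plays the role of the compressed representation; each factor matrix is the orthogonal transformation for its mode.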
== The algorithm ==
MPCA performs feature extraction by determining a multilinear projection that captures most of the variation in the original tensorial input. As in PCA, MPCA works on centered data. The MPCA solution follows the alternating least squares (ALS) approach.〔P. M. Kroonenberg, J. de Leeuw (1980), "Principal component analysis of three-mode data by means of alternating least squares algorithms", ''Psychometrika'', 45, 69–97.〕 It is therefore iterative in nature: it decomposes the original problem into a series of projection subproblems, each of which is a classical PCA problem that can be solved easily.
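The steps above can be sketched in NumPy under simplifying assumptions (a fixed iteration count rather than a convergence test, and eigendecomposition of the mode-wise scatter as the per-mode PCA step). This is an illustrative sketch, not the reference implementation; all names here are made up for the example.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, U, mode):
    """Mode-n product: apply matrix U along axis `mode` of tensor T."""
    return np.moveaxis(np.tensordot(U, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def mpca(samples, ranks, n_iter=5):
    """Minimal MPCA-by-ALS sketch.

    samples: shape (n_samples, I1, ..., IN); ranks: (P1, ..., PN).
    Returns one column-orthogonal factor matrix per mode.
    """
    X = samples - samples.mean(axis=0)            # center the tensor samples
    n_modes = X.ndim - 1
    # Initialize each factor from its mode-wise scatter (a PCA per mode)
    factors = []
    for m in range(n_modes):
        S = unfold(X, m + 1) @ unfold(X, m + 1).T
        _, V = np.linalg.eigh(S)                  # eigenvalues ascending
        factors.append(V[:, ::-1][:, :ranks[m]])  # top-Pm eigenvectors
    # ALS: refine one mode at a time, holding the other modes fixed
    for _ in range(n_iter):
        for m in range(n_modes):
            Y = X
            for k in range(n_modes):
                if k != m:                        # project all other modes
                    Y = mode_multiply(Y, factors[k].T, k + 1)
            S = unfold(Y, m + 1) @ unfold(Y, m + 1).T
            _, V = np.linalg.eigh(S)
            factors[m] = V[:, ::-1][:, :ranks[m]]
    return factors

rng = np.random.default_rng(1)
data = rng.standard_normal((20, 8, 9))            # 20 samples of 8x9 "images"
U1, U2 = mpca(data, (3, 4))
print(U1.shape, U2.shape)                         # (8, 3) (9, 4)
```

Each inner update is an ordinary eigenproblem on a small mode-wise scatter matrix, which is what makes the subproblems "classical PCA problems".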
While PCA with orthogonal transformations produces uncorrelated features/variables, this is not the case for MPCA. Because of the tensor-to-tensor nature of the transformation, MPCA features are in general not uncorrelated, even though the transformation in each mode is orthogonal.〔H. Lu, K. N. Plataniotis, A. N. Venetsanopoulos (2009), "Uncorrelated multilinear principal component analysis for unsupervised multilinear subspace learning", ''IEEE Trans. Neural Netw.'', 20 (11), 1820–1836.〕 In contrast, uncorrelated MPCA (UMPCA) generates uncorrelated multilinear features.

Source: Wikipedia, the free encyclopedia.